Bias Correction with Jackknife, Bootstrap, and Taylor Series

Authors

  • Jiantao Jiao
  • Yanjun Han
  • Tsachy Weissman
Abstract

We analyze the bias correction methods using jackknife, bootstrap, and Taylor series. We focus on the binomial model, and consider the problem of bias correction for estimating f(p), where f ∈ C[0, 1] is arbitrary. We characterize the supremum norm of the bias of general jackknife and bootstrap estimators for any continuous function, and demonstrate that in the delete-d jackknife, different values of d may lead to drastically different behavior. We show that in the binomial model, iterating the bootstrap bias correction infinitely many times may lead to divergence of bias and variance, and demonstrate that the bias properties of the bootstrap bias-corrected estimator after r − 1 rounds are exactly the same as those of the r-jackknife estimator if a bounded coefficients condition is satisfied.
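
As a rough illustration of this setting (not the authors' code), the following Python sketch builds the plug-in, delete-1 jackknife, and one-round bootstrap bias-corrected estimators of f(p) from X ~ Binomial(n, p), and evaluates the exact sup-norm bias of each over a grid of p. The choice f(p) = -p log p, the sample size n = 30, and the grid resolution are illustrative assumptions.

    import numpy as np
    from scipy.stats import binom

    n = 30  # number of Bernoulli trials (illustrative choice)

    def f(p):
        # Illustrative choice of a continuous function on [0, 1];
        # the paper allows arbitrary f in C[0, 1].
        p = np.asarray(p, dtype=float)
        return np.where(p > 0, -p * np.log(np.where(p > 0, p, 1.0)), 0.0)

    x = np.arange(n + 1)          # every possible observed count
    p_hat = x / n                 # plug-in estimate of p
    plug_in = f(p_hat)            # plug-in estimator f(p_hat)

    # Delete-1 jackknife: averaging over the n leave-one-out samples only
    # depends on whether a success or a failure is deleted.
    loo_mean = p_hat * f((x - 1) / (n - 1)) + (1 - p_hat) * f(x / (n - 1))
    jackknife = n * plug_in - (n - 1) * loo_mean

    # One round of bootstrap bias correction:
    # bias_hat(x) = E[f(X*/n)] - f(x/n) with X* ~ Binomial(n, x/n).
    pmf_star = binom.pmf(x[:, None], n, p_hat[None, :])   # rows x*, columns x
    bootstrap_bc = 2 * plug_in - pmf_star.T @ plug_in

    # Exact bias of each estimator as a function of p, and its sup norm.
    p_grid = np.linspace(0.0, 1.0, 501)
    pmf = binom.pmf(x[:, None], n, p_grid[None, :])        # rows x, columns p
    for name, est in [("plug-in", plug_in), ("jackknife", jackknife),
                      ("bootstrap-corrected", bootstrap_bc)]:
        bias = est @ pmf - f(p_grid)
        print(f"{name:20s} sup-norm bias ~ {np.abs(bias).max():.4f}")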

Similar Articles

Higher Order Properties of Bootstrap and Jackknife Bias Corrected Maximum Likelihood Estimators

Pfanzagl and Wefelmeyer (1978) show that bias-corrected ML estimators are higher-order efficient. Their procedure, however, is computationally complicated because it requires integrating complicated functions over the distribution of the MLE. The purpose of this paper is to show that these integrals can be replaced by sample averages without affecting the higher-order variance. We focus...
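
To make the idea concrete (a generic sketch, not this paper's estimator), the bias integral over the sampling distribution of the MLE can be replaced by a parametric-bootstrap sample average; the exponential model, sample size, and number of resamples below are assumptions chosen for the demo.

    import numpy as np

    rng = np.random.default_rng(0)
    theta_true = 2.0                      # rate of an Exponential(theta) model (assumed)
    x = rng.exponential(1 / theta_true, size=25)

    theta_mle = 1 / x.mean()              # MLE of the rate; upward biased in small samples

    # Parametric bootstrap: simulate from the fitted model and average the MLE,
    # so no explicit integration over the sampling distribution is needed.
    B = 2000
    theta_star = np.array([1 / rng.exponential(1 / theta_mle, size=x.size).mean()
                           for _ in range(B)])

    bias_hat = theta_star.mean() - theta_mle   # sample-average estimate of the bias
    theta_bc = theta_mle - bias_hat            # = 2 * theta_mle - theta_star.mean()
    print(f"MLE: {theta_mle:.3f}   bias-corrected: {theta_bc:.3f}")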

Effects of Bagging and Bias Correction on Estimators Defined by Estimating Equations

Bagging an estimator approximately doubles its bias through the impact of bagging on quadratic terms in expansions of the estimator. This difficulty can be alleviated by bagging a suitably bias-corrected estimator, however. In these and other circumstances, what is the overall impact of bagging and/or bias correction, and how can it be characterised? We answer these questions in the case of gen...
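
A toy illustration of both claims (an assumption-laden sketch, not this paper's estimating-equation setting): for the quadratic functional θ = μ² with plug-in estimator x̄², bagging roughly doubles the O(1/n) bias, while bagging an estimator whose correction anticipates that doubling keeps the bias near zero. Here "suitably bias-corrected" is read, for this toy case only, as subtracting twice the leading bias term.

    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, n = 1.0, 2.0, 30    # target theta = mu**2; plug-in bias ~ sigma**2 / n
    B, reps = 200, 1000            # resamples per bagging pass; Monte Carlo repetitions

    def bag(estimator, x):
        # Bootstrap aggregation: average the estimator over B nonparametric resamples.
        idx = rng.integers(0, x.size, size=(B, x.size))
        return np.mean([estimator(x[i]) for i in idx])

    plug_in    = lambda x: x.mean() ** 2
    corrected  = lambda x: x.mean() ** 2 - x.var(ddof=1) / x.size      # removes ~sigma**2/n
    corrected2 = lambda x: x.mean() ** 2 - 2 * x.var(ddof=1) / x.size  # anticipates bagging

    acc = np.zeros(4)
    for _ in range(reps):
        x = rng.normal(mu, sigma, size=n)
        acc += [plug_in(x), bag(plug_in, x), bag(corrected, x), bag(corrected2, x)]

    labels = ["plug-in", "bagged plug-in", "bagged corrected", "bagged doubly corrected"]
    for name, avg in zip(labels, acc / reps):
        print(f"{name:24s} bias ~ {avg - mu ** 2:+.3f}")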

Bias Evaluation in the Proportional Hazards Model

We consider two approaches for bias evaluation and reduction in the proportional hazards model (PHM) proposed by Cox. The first one is an analytical approach in which we derive the n^{-1} bias term of the maximum partial likelihood estimator. The second approach consists of resampling methods, namely the jackknife and the bootstrap. We compare all methods through a comprehensive set of Monte Carlo ...
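
The resampling approach mentioned above can be written as a short generic routine. In the sketch below, fit_coef is a hypothetical placeholder for any fitter that maps a data matrix to a coefficient vector (e.g. a wrapper around a partial-likelihood fitter); only the delete-1 jackknife recipe itself is standard.

    import numpy as np

    def jackknife_bias_correct(fit_coef, data):
        # fit_coef: hypothetical placeholder; any function mapping an (n, d)
        # array of observations to a coefficient vector.
        n = len(data)
        full = np.asarray(fit_coef(data))
        loo = np.array([fit_coef(np.delete(data, i, axis=0)) for i in range(n)])
        bias_hat = (n - 1) * (loo.mean(axis=0) - full)  # delete-1 jackknife bias estimate
        return full - bias_hat                          # bias-corrected coefficients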

A jackknife type approach to statistical model selection

Procedures such as the Akaike information criterion (AIC), Bayesian information criterion (BIC), minimum description length (MDL), and the bootstrap information criterion have been developed in the statistical literature for model selection. Most of these methods rely on an estimate of bias. This bias, which is inevitable in model selection problems, arises from estimating the distance between an unknown tr...
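
A small simulation of the bias such criteria estimate (illustrative assumptions: a two-parameter Gaussian model, n = 50): the in-sample mean log-likelihood of the fitted model overshoots its expected value on fresh data by roughly k/n, which is the quantity an AIC-style penalty corrects.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    n, reps, k = 50, 4000, 2        # k fitted parameters: mean and standard deviation
    optimism = np.empty(reps)
    for r in range(reps):
        train = rng.normal(0.0, 1.0, size=n)
        mu_hat, sd_hat = train.mean(), train.std()     # Gaussian MLEs
        in_sample = norm.logpdf(train, mu_hat, sd_hat).mean()
        # Exact expected log-likelihood of the fitted model under the true N(0, 1):
        out_sample = (-0.5 * np.log(2 * np.pi * sd_hat ** 2)
                      - (1 + mu_hat ** 2) / (2 * sd_hat ** 2))
        optimism[r] = in_sample - out_sample

    print(f"average optimism per observation: {optimism.mean():.4f}")
    print(f"AIC-style correction k/n:         {k / n:.4f}")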

Parametric bootstrap methods for bias correction in linear mixed models

The empirical best linear unbiased predictor (EBLUP) in the linear mixed model (LMM) is useful for small area estimation, and the estimation of the mean squared error (MSE) of EBLUP is important as a measure of the uncertainty of EBLUP. To obtain a second-order unbiased estimator of the MSE, the second-order bias correction has been derived mainly based on Taylor series expansions. However, thi...
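
The parametric-bootstrap alternative follows a simple generic pattern, sketched below. Here g, fit_model, and simulate_from are hypothetical placeholders (e.g. a first-order plug-in MSE approximation, an LMM fitter, and a simulator for the fitted model); the correction 2·g(θ̂) minus the mean over bootstrap refits is the standard form of such corrections, not code from this paper.

    import numpy as np

    def bootstrap_bias_corrected(g, fit_model, simulate_from, data, B=200, seed=0):
        # g(theta):          plug-in quantity whose plug-in bias we want to remove.
        # fit_model(data):   returns estimated model parameters theta_hat.
        # simulate_from(theta, rng): draws one data set from the fitted model.
        # All three are hypothetical placeholders supplied by the user.
        rng = np.random.default_rng(seed)
        theta_hat = fit_model(data)
        plug_in = np.asarray(g(theta_hat))
        boot = np.array([np.asarray(g(fit_model(simulate_from(theta_hat, rng))))
                         for _ in range(B)])
        bias_hat = boot.mean(axis=0) - plug_in  # bias induced by plugging in theta_hat
        return plug_in - bias_hat               # = 2 * g(theta_hat) - bootstrap mean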

Journal:
  • CoRR

Volume: abs/1709.06183
Pages: -
Publication date: 2017